General properties of general Bayesian learning
Authors

∗ MTA Rényi Institute of Mathematics, Budapest, Hungary, [email protected]
† Department of Philosophy, Logic and Scientific Method, London School of Economics and Political Science, Houghton Street, London WC2A 2AE, UK, [email protected]

Abstract
We investigate the general properties of general Bayesian learning, where "general Bayesian learning" means inferring a state from another state that is regarded as evidence, and where the inference consists in conditionalizing the evidence using the conditional expectation determined by a reference probability measure representing the background subjective degrees of belief of a Bayesian Agent performing the inference. States are linear functionals that encode probability measures by assigning expectation values to random variables via integration with respect to the probability measure. If a state can be learned from another in this way, it is said to be Bayes accessible from the evidence. It is shown that the Bayes accessibility relation is reflexive, antisymmetric and non-transitive. If every state is Bayes accessible from some other state defined on the same set of random variables, then the set of states is called weakly Bayes connected. It is shown that the set of states is not weakly Bayes connected if the probability space is standard. The set of states is called weakly Bayes connectable if, given any state, the probability space can be extended in such a way that the given state becomes Bayes accessible from some other state in the extended space. It is shown that probability spaces are weakly Bayes connectable. Since conditioning using the theory of conditional expectations includes both Bayes' rule and Jeffrey conditionalization as special cases, the results presented substantially generalize some results obtained earlier for Jeffrey conditionalization.

1 Review of main results

In this paper we investigate the general properties of general Bayesian learning. By "general Bayesian learning" we mean inferring a probability measure from another that is regarded as evidence, where the inference consists in conditionalizing the probability measure representing the evidence using the conditional expectation determined by a reference probability measure that is interpreted as representing the background subjective degrees of belief of a Bayesian Agent performing the inference. The investigation is motivated by the observation that the properties of Bayesian learning we wish to determine do not seem to have been analyzed in the literature on Bayesianism at the level of generality we aim at here. (For monographic works on Bayesianism we refer to [22], [3], [44]; for papers discussing basic aspects of Bayesianism see [21], [19], [20]; the recent paper by Weisberg [43] provides a compact review of Bayesianism.) In particular, in this paper we take the position that the proper general technical device to perform Bayesian conditioning is the theory of conditional expectations.
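To fix ideas, the following is a minimal sketch of the kind of conditioning discussed above; the notation (the probability space $(X,\mathcal{S},p)$, the sub-$\sigma$-algebra $\mathcal{A}$ and the evidence measure $q$) is ours and is intended only as an illustration, not as a verbatim reproduction of the paper's definitions. Suppose the Agent's background degrees of belief are given by a probability space $(X,\mathcal{S},p)$ and the evidence is a probability measure $q$ defined on a sub-$\sigma$-algebra $\mathcal{A}\subseteq\mathcal{S}$, with $q$ absolutely continuous with respect to the restriction of $p$ to $\mathcal{A}$ so that the $p$-almost-everywhere defined conditional expectation can be integrated against $q$. The state inferred from the evidence is then

\[
  q^{p}(B) \;=\; \int_X \mathbb{E}_p\!\left[\chi_B \mid \mathcal{A}\right]\, dq,
  \qquad B \in \mathcal{S},
\]

where $\chi_B$ is the characteristic function of $B$ and $\mathbb{E}_p[\,\cdot \mid \mathcal{A}]$ is the conditional expectation determined by $p$. In this sketch, if $\mathcal{A}$ is generated by a single event $A$ with $p(A)>0$ and $q(A)=1$, the formula reduces to Bayes' rule, $q^{p}(B)=p(B\cap A)/p(A)$; if $\mathcal{A}$ is generated by a countable partition $\{A_i\}$ with $p(A_i)>0$ and $q(A_i)=r_i$, it reduces to Jeffrey conditionalization, $q^{p}(B)=\sum_i r_i\, p(B\cap A_i)/p(A_i)$.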